Abstract: In this paper, we study the largest eigenvalues of sample covariance matrices with elliptically distributed data. We consider the sample covariance matrix $Q = YY^{*}$, where the data matrix $Y \in \mathbb{R}^{p \times n}$ contains i.i.d. $p$-dimensional observations $\mathbf{y}_{i} = \xi_{i} T \mathbf{u}_{i}$, $i = 1, \dots, n$. Here $\mathbf{u}_{i}$ is uniformly distributed on the unit sphere, $\xi_{i} \sim \xi$ is a random variable independent of $\mathbf{u}_{i}$, and $T^{*}T = \varSigma$ is a deterministic positive definite matrix. Under mild regularity assumptions on $\varSigma$, and assuming that $\xi^{2}$ has bounded support and a certain decay behaviour near its edge, so that the limiting spectral distribution of $Q$ exhibits square-root decay near the spectral edge, we prove that the Tracy–Widom law holds for the largest eigenvalues of $Q$ when $p$ and $n$ are comparably large. Based on these results, we further construct statistics to detect signals corrupted by high-dimensional elliptically distributed noise.
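As a rough numerical illustration (not taken from the paper), the elliptical model above can be simulated directly. The specific choices of $\varSigma$, $T$, and the law of $\xi$ below are hypothetical stand-ins that merely satisfy the stated assumptions (positive definite $\varSigma$, $T^{*}T = \varSigma$, bounded support for $\xi^{2}$):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 200, 400  # p and n comparably large

# Hypothetical deterministic positive definite Sigma, and a factor T with T^* T = Sigma
Sigma = np.diag(np.linspace(1.0, 2.0, p))
T = np.linalg.cholesky(Sigma).T  # then T.T @ T == Sigma

# u_i uniform on the unit sphere: normalized Gaussian columns
G = rng.standard_normal((p, n))
U = G / np.linalg.norm(G, axis=0)

# xi with xi^2 of bounded support (an illustrative choice, not the paper's)
xi = np.sqrt(p * rng.uniform(0.5, 1.0, size=n))

# Fold the usual 1/sqrt(n) scaling into Y, so that Q = Y Y^* has O(1) eigenvalues
Y = (xi / np.sqrt(n)) * (T @ U)
Q = Y @ Y.T

lam = np.linalg.eigvalsh(Q)  # eigenvalues in ascending order
print("largest eigenvalue:", lam[-1])
```

Repeating this over many independent draws and centering/scaling `lam[-1]` appropriately would let one compare its empirical fluctuations against the Tracy–Widom distribution.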
-
Free, publicly-accessible full text available December 3, 2025
-
Abstract: We establish a new perturbation theory for orthogonal polynomials using a Riemann–Hilbert approach and consider applications in numerical linear algebra and random matrix theory. This new approach shows that the orthogonal polynomials with respect to two measures can be effectively compared using the difference of their Stieltjes transforms on a suitably chosen contour. Moreover, when two measures are close and satisfy some regularity conditions, we use the theta functions of a hyperelliptic Riemann surface to derive explicit and accurate expansion formulae for the perturbed orthogonal polynomials. In contrast to other approaches, a key strength of the methodology is that estimates can remain valid as the degree of the polynomial grows. The results are applied to analyze several numerical algorithms from linear algebra, including the Lanczos tridiagonalization procedure, the Cholesky factorization, and the conjugate gradient algorithm. As a case study, we investigate these algorithms applied to a general spiked sample covariance matrix model by considering the eigenvector empirical spectral distribution and its limits. For the first time, we give precise estimates on the output of the algorithms, applied to this wide class of random matrices, as the number of iterations diverges. In this setting, beyond the first order expansion, we also derive a new mesoscopic central limit theorem for the associated orthogonal polynomials and other quantities relevant to numerical algorithms.
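A minimal sketch (not from the paper) of one of the algorithms discussed, Lanczos tridiagonalization, applied to a spiked sample covariance matrix; the dimensions, spike strength, and starting vector below are hypothetical:

```python
import numpy as np

def lanczos(A, b, k):
    """Plain Lanczos: returns the diagonal and off-diagonal of the k-step Jacobi matrix."""
    q_prev = np.zeros_like(b)
    q = b / np.linalg.norm(b)
    alphas, betas = [], []
    beta = 0.0
    for _ in range(k):
        v = A @ q - beta * q_prev
        alpha = q @ v
        v = v - alpha * q
        beta = np.linalg.norm(v)
        alphas.append(alpha)
        betas.append(beta)
        q_prev, q = q, v / beta
    return np.array(alphas), np.array(betas[:-1])

# Spiked sample covariance matrix W = (1/n) X X^* with one spiked population eigenvalue
rng = np.random.default_rng(1)
p, n, spike = 300, 600, 5.0
Sigma = np.eye(p)
Sigma[0, 0] = spike
X = np.sqrt(np.diag(Sigma))[:, None] * rng.standard_normal((p, n))
W = X @ X.T / n

a, b_off = lanczos(W, rng.standard_normal(p), k=30)
J = np.diag(a) + np.diag(b_off, 1) + np.diag(b_off, -1)
ritz = np.linalg.eigvalsh(J)  # Ritz values; the largest tracks the outlier eigenvalue
```

The recurrence coefficients `a` and `b_off` are exactly the three-term recurrence coefficients of the orthogonal polynomials associated with `W` and the starting vector, which is the kind of quantity the perturbation theory above controls as the number of iterations grows.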
